
    Fine-grained Haptics: Sensing and Actuating Haptic Primary Colours (force, vibration, and temperature)

    This thesis discusses the development of a multimodal, fine-grained visual-haptic system for teleoperation and robotic applications. This system is primarily composed of two complementary components: an input device known as the HaptiTemp sensor (combining "Haptics" and "Temperature"), a novel thermosensitive GelSight-like sensor, and an output device, an untethered multimodal fine-grained haptic glove. The HaptiTemp sensor is a visuotactile sensor that can sense the haptic primary colours: force, vibration, and temperature. It has novel switchable UV markers that can be made visible using UV LEDs. The switchable markers are a key novelty of the HaptiTemp because they allow tactile information to be extracted from gel deformation without impairing the ability to classify or recognise images. The use of switchable markers in the HaptiTemp sensor resolves the trade-off between marker density and capturing high-resolution images with a single sensor. The HaptiTemp sensor can measure vibrations by counting the number of blobs or pulses detected per unit time using a blob detection algorithm. For the first time, temperature detection was incorporated into a GelSight-like sensor, making the HaptiTemp a haptic primary colours sensor. The HaptiTemp sensor can also perform rapid temperature sensing, with a 643 ms response time over the 31°C to 50°C temperature range. This fast temperature response is comparable to the withdrawal reflex response in humans, making this the first sensor in the robotics community that can trigger a sensory impulse mimicking a human reflex. The HaptiTemp sensor can also perform simultaneous temperature sensing and image classification using a machine vision camera, the OpenMV Cam H7 Plus, a capability that has not previously been reported or demonstrated by any tactile sensor. The HaptiTemp sensor can be used in teleoperation because it can transmit tactile analysis and image classification results over a wireless link. In tactile sensing, tactile pattern recognition, and rapid temperature response, the HaptiTemp sensor is the closest yet to human skin. In order to feel what the HaptiTemp sensor is touching from a distance, a corresponding output device, an untethered multimodal haptic hand wearable, was developed to actuate the haptic primary colours sensed by the HaptiTemp sensor. This wearable communicates wirelessly and provides fine-grained cutaneous feedback so the user can feel the edges or surfaces of the tactile images captured by the HaptiTemp sensor. It also has gradient kinesthetic force feedback that can restrict finger movements based on the force estimated by the HaptiTemp sensor: a retractable string from an ID badge holder, equipped with mini servos that control the stiffness of the wire, is attached to each fingertip. Vibrations detected by the HaptiTemp sensor can be actuated by the tapping motion of the tactile pins or by a buzzing mini vibration motor. There is also a tiny annular Peltier device, or ThermoElectric Generator (TEG), with a mini vibration motor, forming thermo-vibro feedback in the palm area that can be activated by a 'hot' or 'cold' signal from the HaptiTemp sensor. The haptic primary colours can also be embedded in a VR environment and actuated by the multimodal hand wearable.
A VR application was developed to demonstrate rapid tactile actuation of edges, allowing the user to feel the contours of virtual objects. Collision detection scripts were embedded to activate the corresponding actuator in the multimodal haptic hand wearable whenever the tactile matrix simulator or hand avatar in VR collides with a virtual object. The TEG also gets warm or cold depending on the virtual object the participant has touched. Tests were conducted to explore virtual objects in 2D and 3D environments using Leap Motion control and a VR headset (Oculus Quest 2). Moreover, fine-grained cutaneous feedback was developed to feel the edges or surfaces of a tactile image, such as the tactile images captured by the HaptiTemp sensor, or to actuate tactile patterns on 2D or 3D virtual objects. The prototype is an exoskeleton-like glove with 16 tactile actuators (tactors) on each fingertip, 80 tactile pins in total, made from commercially available P20 Braille cells. Each tactor can be controlled individually to enable the user to feel the edges or surfaces of images, such as the high-resolution tactile images captured by the HaptiTemp sensor. This hand wearable can be used to enhance the immersive experience in a virtual reality environment. The tactors can be actuated in a tapping manner, creating a form of vibration feedback distinct from the buzzing vibration produced by a mini vibration motor. The tactile pin height can also be varied, creating a gradient of pressure on the fingertip. Finally, the integration of the high-resolution HaptiTemp sensor and the untethered multimodal, fine-grained haptic hand wearable is presented, forming a visuotactile system for sensing and actuating haptic primary colours. Force, vibration, and temperature sensing tests, with corresponding force, vibration, and temperature actuating tests, demonstrate a unified visual-haptic system. Aside from sensing and actuating haptic primary colours, touching the edges or surfaces of the tactile images captured by the HaptiTemp sensor was carried out using the fine-grained cutaneous feedback of the haptic hand wearable.
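    As a concrete illustration of the blob-counting approach to vibration sensing described above, the sketch below counts blob detections per unit time in frames from a camera. It is a minimal approximation, not the thesis implementation; the camera index, counting window, and blob-size threshold are assumptions.

```python
# Minimal sketch (not the thesis implementation): estimate a vibration/pulse
# rate by counting blob detections per unit time in frames from a
# GelSight-like sensor's camera. Camera index, window, and blob parameters
# are assumptions.
import time
import cv2

params = cv2.SimpleBlobDetector_Params()
params.filterByArea = True
params.minArea = 20            # assumed blob size for a pulse event, in pixels
detector = cv2.SimpleBlobDetector_create(params)

cap = cv2.VideoCapture(0)      # assumed camera index for the sensor's camera
pulses, t0 = 0, time.time()
while time.time() - t0 < 2.0:  # 2-second counting window
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    pulses += len(detector.detect(gray))   # one detection ~ one pulse event
rate_hz = pulses / (time.time() - t0)
print(f"estimated pulse rate: {rate_hz:.1f} per second")
cap.release()
```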

    Visuotactile Sensors with Emphasis on GelSight Sensor: A Review

    This review paper focuses on vision- and touch-based sensors known as visuotactile sensors. The study of visuotactile sensation and perception dates back centuries and has become a multidisciplinary field involving philosophers, psychologists, biologists, engineers, technologists, and roboticists in haptics, machine vision, and artificial intelligence. To the best of our knowledge, the earliest recorded visuotactile sensor was applied neither to robotics nor to hand or finger imprint analysis, but to recording the foot pressure distribution of a walking or standing human, and was known as the pedobarograph. Our review presents the literature on visuotactile sensors that led to a high-resolution, miniature, pedobarograph-like sensor known as the GelSight sensor. Moreover, this review focuses on the architecture, techniques, and hardware and software development of the GelSight sensor since 2009, together with its applications in haptics, robotics, and computer vision.

    Low-cost GelSight with UV Markings: Feature Extraction of Objects Using AlexNet and Optical Flow without 3D Image Reconstruction

    The GelSight sensor has been used to study the microgeometry of objects in tactile sensing applications since 2009. The elastomer, reflective coating, lighting, and camera have been the main challenges in making a GelSight sensor within a short time. The recent addition of permanent markers to the GelSight opened a new era in shear/slip studies. In our previous studies, we introduced Ultraviolet (UV) ink and UV LEDs as a new form of marker and lighting, respectively. UV ink markers are invisible under ordinary LEDs but can be made visible under UV LEDs. Currently, recognition of objects or surface textures using the GelSight sensor is done by fusing camera-only images with GelSight-captured images bearing permanent markings, and feeding those images to Convolutional Neural Networks (CNN) to classify objects. With our novel approach of using a low-cost GelSight sensor with UV markings, however, the 3D height map to 2D image conversion and the additional non-GelSight images for training the CNN can be eliminated. AlexNet was used for feature recognition of five coins without UV markings, and an optical flow algorithm was used for shear/slip of the coin in the GelSight with UV markings. Our confusion matrix results show that coin recognition reaches 93.4% accuracy on average without UV markings using AlexNet. Therefore, our novel method of using GelSight with UV markings would be useful for recognizing full or partial objects, shear/slip, and the force applied to objects without any 3D image reconstruction.
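    The coin-recognition step described above can be approximated with off-the-shelf tools; the sketch below adapts a pretrained AlexNet from torchvision to five coin classes. It is not the paper's training code, and the dataset path, batch size, and learning rate are assumptions.

```python
# Minimal sketch, not the paper's training code: adapt a pretrained AlexNet
# from torchvision to five coin classes from GelSight images. The dataset
# path, batch size, and learning rate are assumptions.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

tfm = transforms.Compose([
    transforms.Resize((224, 224)),   # AlexNet's expected input size
    transforms.ToTensor(),
])
train_set = datasets.ImageFolder("gelsight_coins/train", transform=tfm)  # assumed path
loader = torch.utils.data.DataLoader(train_set, batch_size=16, shuffle=True)

model = models.alexnet(weights=models.AlexNet_Weights.DEFAULT)
model.classifier[6] = nn.Linear(4096, 5)   # replace the final layer: five coin classes
optimizer = torch.optim.Adam(model.classifier.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:              # a single epoch shown for brevity
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
```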

    HaptiTemp: A Next-Generation Thermosensitive GelSight-like Visuotactile Sensor

    This study describes the creation of a new type of compact, skin-like, silicone-based thermosensitive visuotactile sensor based on GelSight technology. The easy integration of this novel sensor into a complex visuotactile system capable of very rapid detection of temperature change (30°C/s) is unique in providing a system that parallels the withdrawal reflex of the human autonomic system to extreme heat. To the best of the authors' knowledge, this is the first sensor in the robotics community that can trigger a sensory impulse resembling the human withdrawal reflex. To attain this, we used blue, orange, and black thermochromic pigments with thresholds of 31°C, 43°C, and 50°C, respectively, on the gel material. Each pigment becomes translucent when its temperature threshold is reached, making it possible to stack thermochromic pigments of different colors and thresholds. The pigments were air-brushed onto a low-cost, commercially available transparent silicone sponge. We used MobileNetV2 and transfer learning to simulate tactile preprocessing in order to recognize five different objects. The new thermosensitive visuotactile sensor helped achieve 97.3% tactile image classification accuracy across the five objects. Our novel thermosensitive visuotactile sensor could be of benefit in material texture analysis, telerobotics, space exploration, and medical applications.
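    A rough illustration of how a stacked thermochromic coating can yield a coarse temperature estimate is sketched below: the band is inferred from which pigment colour currently dominates a region of interest. The HSV ranges and the colour-to-band mapping are illustrative assumptions, not calibrated values from the paper.

```python
# Minimal sketch, assuming a layered thermochromic coating like the one
# described above: infer a coarse temperature band from which pigment colour
# currently dominates a region of interest. HSV ranges and the colour-to-band
# mapping are illustrative assumptions, not calibrated values.
import cv2
import numpy as np

def temperature_band(bgr_roi):
    hsv = cv2.cvtColor(bgr_roi, cv2.COLOR_BGR2HSV)
    masks = {
        "below 31 C (blue layer visible)": cv2.inRange(hsv, (100, 80, 60), (130, 255, 255)),
        "31-43 C (orange layer visible)":  cv2.inRange(hsv, (5, 80, 60), (25, 255, 255)),
        "43-50 C (black layer visible)":   cv2.inRange(hsv, (0, 0, 0), (180, 255, 50)),
    }
    # Take the band whose pigment covers the largest area of the ROI.
    return max(masks, key=lambda k: int(np.count_nonzero(masks[k])))

# Synthetic orange patch (BGR) stands in for a frame region at roughly 35 C.
roi = np.zeros((100, 100, 3), np.uint8)
roi[:] = (0, 128, 255)
print(temperature_band(roi))   # expected: "31-43 C (orange layer visible)"
```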

    A Novel Untethered Hand Wearable with Fine-Grained Cutaneous Haptic Feedback

    During open surgery, a surgeon relies not only on a detailed view of the organ being operated upon and on being able to feel the fine details of this organ, but also heavily on the combination of these two senses. In laparoscopic surgery, haptic feedback provides surgeons with information on the interaction forces between instrument and tissue. There have been many studies to date that mimic haptic feedback in laparoscopic-related telerobotics. However, cutaneous feedback is mostly restricted or limited in haptic feedback-based minimally invasive studies. We argue that fine-grained information about the instrument's end is needed in laparoscopic surgery and can be conveyed via cutaneous feedback. We propose an exoskeleton haptic hand wearable which consists of five 4 × 4 miniaturized fingertip actuators, 80 in total, to convey cutaneous feedback. The wearable is modular, lightweight, Bluetooth- and WiFi-enabled, and has a maximum power consumption of 830 mW. Software was developed to demonstrate rapid tactile actuation of edges, allowing the user to feel contours through cutaneous feedback. Initial tests were carried out in 2D, with the object displayed on a flat monitor. In the second phase, the wearable exoskeleton glove was further developed to feel 3D virtual objects in a VR environment using a virtual reality (VR) headset. Two-dimensional and 3D objects were tested with our novel untethered haptic hand wearable. Our results show that users recognize cutaneous actuation from a single tap with 92.22% accuracy. Our wearable has an average latency of 46.5 ms, well below the 600 ms delay considered tolerable by a surgeon in teleoperation. Therefore, we suggest that our untethered hand wearable can enhance multimodal perception in minimally invasive surgery, allowing the surgeon to naturally feel the immediate environment of the instruments.
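    To illustrate how an edge image might drive five 4 × 4 fingertip arrays (80 pins in total), the sketch below partitions an edge map into one strip per finger and downsamples each strip to a 4 × 4 on/off pin pattern. The strip-per-finger layout and the fill threshold are assumptions, not the wearable's published mapping.

```python
# Minimal sketch, not the wearable's firmware: turn an edge image into five
# 4x4 on/off pin patterns, one per fingertip (80 pins in total). The vertical
# strip-per-finger layout and the fill threshold are assumptions.
import cv2
import numpy as np

img = np.zeros((200, 320), np.uint8)
cv2.rectangle(img, (40, 40), (280, 160), 255, -1)   # synthetic object
edges = cv2.Canny(img, 100, 200)

def fingertip_patterns(edge_img, fingers=5, grid=4):
    strips = np.array_split(edge_img, fingers, axis=1)   # one vertical strip per finger
    patterns = []
    for strip in strips:
        small = cv2.resize(strip, (grid, grid), interpolation=cv2.INTER_AREA)
        patterns.append((small > 32).astype(np.uint8))   # 1 = raise pin, 0 = lower
    return patterns

for finger, pins in enumerate(fingertip_patterns(edges)):
    print(f"finger {finger}:\n{pins}")
```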

    Pilot Study: Low Cost GelSight Sensor

    The GelSight sensor and related technology have been studied for a decade to date and have proven worth exploring in many haptics and tactile sensing applications. The elastomer, reflective coating, lighting, and camera are the main challenges in making a GelSight sensor within a short time. In this workshop paper, we present our preliminary studies on how to make a GelSight sensor using low-cost materials. We used a clear silicone cosmetic sponge as the elastomeric slab, which skips the degassing process and hours of curing time. Moreover, we used Psycho Paint® for the reflective coating, Light Emitting Diodes (LEDs) for the lighting, and a Logitech C270 webcam as the camera in our experimental setup. Furthermore, Ultraviolet (UV) ink and UV LEDs were tested as markers on the reflective coating and as lighting, respectively. UV ink markers are invisible under ordinary LEDs but can be made visible under UV lighting. Comparable results demonstrate the effectiveness of our setup.

    An Untethered Multimodal Haptic Hand Wearable

    Haptic primary colors correspond to temperature, vibration, and force. Previous studies combined these three haptic primary colors to produce different types of cutaneous sensations without the need to touch a real object. This study presents a low-cost untethered hand wearable with temperature, vibration, and force feedback, made from low-cost, commercial off-the-shelf components. A 26 mm annular Peltier element with a 10 mm hole is coupled to an 8 mm mini disc vibration motor, forming vibro-thermal tactile feedback for the user. Each of the other fingertips has an 8 mm disc vibration motor strapped on with Velcro. Moreover, a retractable ID badge holder with a small solenoid stopper provides kinesthetic force feedback that restricts the fingers' movement. Hand and finger tracking is done using a Leap Motion Controller interfaced with a virtual setup of different geometric figures developed in Unity. Therefore, we argue that this prototype as a whole actuates cutaneous and kinesthetic feedback that would be useful in many virtual applications such as Virtual Reality (VR), teleoperated surgery, and teleoperated farming and agriculture.
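    The abstract does not describe a command protocol for the wearable, so the sketch below is purely hypothetical: it shows one plausible way a host script could forward virtual-contact events to the device as single-byte commands over a serial link using pyserial. The byte codes, port name, and baud rate are invented for illustration.

```python
# Hypothetical sketch only: every byte code, the port name, and the baud rate
# below are invented for illustration. It shows one plausible way a host
# script could forward virtual-contact events to the wearable over serial.
import serial  # pyserial

COMMANDS = {                       # hypothetical single-byte command codes
    "vibrate_fingertip": b"V",     # pulse an 8 mm disc vibration motor
    "peltier_hot": b"H",           # heat the annular Peltier element
    "peltier_cold": b"C",          # cool the annular Peltier element
    "lock_finger": b"L",           # engage the solenoid stopper on the badge reel
}

def send_event(port, event):
    """Write the single-byte code for a contact event to the wearable."""
    port.write(COMMANDS[event])

if __name__ == "__main__":
    # Assumed serial port; on the real device this would be the wearable's link.
    with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as port:
        send_event(port, "peltier_hot")        # hand avatar touched a 'hot' object
        send_event(port, "vibrate_fingertip")  # fingertip contact event
```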

    Pilot Study: A Visuotactile Haptic Primary Colors Sensor

    In this paper, we present our preliminary studies on how to make a unified skin-like visuotactile sensor capable of sensing the haptic primary colors, namely force, vibration, and temperature. Our sensor is based on GelSight technology, which has proven its worth in the fields of haptics, robotics, and computer vision. In our previous studies, we proposed switchable UltraViolet (UV) markers that can be turned on using UV light. These markers can be tracked using an optical flow algorithm to visualize forces related to gel deformation. In this study, we introduced layers of thermochromic pigments on the reflective layer, making our visuotactile sensor capable of sensing not only force and vibration, inferred from gel deformation, but also the temperature of the contacted object, by analyzing the change of hue in the reflective coating.
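    As an illustration of the marker-tracking step mentioned above, the sketch below computes dense optical flow between a synthetic marker frame and a shifted copy and reports the mean displacement, a crude stand-in for force-related gel deformation. The synthetic marker grid, the 2 px shift, and the Farneback parameters are assumptions, not values from the paper.

```python
# Minimal sketch of the marker-tracking idea, not the paper's implementation:
# dense optical flow between a synthetic marker frame and a shifted copy
# approximates the displacement field of the UV markers under gel deformation.
import cv2
import numpy as np

prev = np.zeros((240, 320), np.uint8)
for y in range(20, 240, 30):
    for x in range(20, 320, 30):
        cv2.circle(prev, (x, y), 3, 255, -1)   # synthetic marker grid
curr = np.roll(prev, 2, axis=1)                # markers shifted 2 px to the right

flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)
mag, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])
print(f"mean marker displacement: {mag.mean():.2f} px")  # proxy for deformation
```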

    4x4 Fingertip Tactile Matrix Actuator with Edge Detection Scanning ROI Simulator

    This study presents a novel 4x4 fingertip tactile matrix actuator that can be strapped onto a finger. It is made from Dot Braille cells purchased from Dot Inc., Korea. The prototype has a surface area of 1.08 cm² with a pin pitch of 2.6 mm and operates from a 5 V supply. Each tactile pin can be controlled using an H-bridge motor driver and an Arduino microcontroller. The tactile matrix is coupled with a tactile matrix simulator that scans a binary image or the edges of an image obtained with a Canny edge detector. The simulator has 16 sections corresponding to the 16 actuator pins. The integration of the simulator with the hardware prototype allows the user to feel a binary image of a plane geometric figure or the edges of an image as the scanning region of interest (ROI) moves across the visual screen. This fingertip tactile matrix display would be useful in many Virtual Reality (VR) applications to provide tactile feedback on the textures of virtual objects. Therefore, the authors suggest that this device will be beneficial in many applications such as virtual surgery, virtual fashion, remote sensing, and telerobotics.
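    A minimal sketch of the simulator idea is given below, assuming OpenCV: Canny edges are computed for a synthetic shape, a square ROI sweeps across the image, and each ROI is divided into a 4x4 grid whose cells set the 16 pin states. The ROI size, stride, and fill threshold are assumptions rather than the published simulator's parameters.

```python
# Minimal sketch of the simulator idea, not the published code: Canny edges of
# a synthetic shape, a square ROI sweeping across the image, and a 4x4 grid
# per ROI whose cells set the 16 pin states.
import cv2
import numpy as np

img = np.zeros((240, 320), np.uint8)
cv2.circle(img, (160, 120), 80, 255, -1)       # synthetic shape to scan
edges = cv2.Canny(img, 100, 200)

def roi_to_pins(roi, grid=4, fill=0.05):
    cell_h, cell_w = roi.shape[0] // grid, roi.shape[1] // grid
    pins = np.zeros((grid, grid), np.uint8)
    for r in range(grid):
        for c in range(grid):
            cell = roi[r*cell_h:(r+1)*cell_h, c*cell_w:(c+1)*cell_w]
            pins[r, c] = (cell > 0).mean() > fill   # 1 = raise pin, 0 = lower
    return pins

size, stride = 64, 32
for x in range(0, edges.shape[1] - size + 1, stride):   # ROI sweeps left to right
    print(f"x={x}\n{roi_to_pins(edges[88:88+size, x:x+size])}")
```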